Conditional Log-Likelihood for Continuous Time Bayesian Network Classifiers
Authors
Abstract
Continuous time Bayesian network classifiers are designed for analyzing multivariate streaming data when the time duration of events matters. New continuous time Bayesian network classifiers are introduced, and a conditional log-likelihood scoring function is developed for them. A learning algorithm combining conditional log-likelihood scoring with Bayesian parameter estimation is then presented. Classification accuracies achieved on synthetic and real data by continuous time and dynamic Bayesian network classifiers are compared. Numerical experiments show that the proposed approach outperforms both dynamic Bayesian network classifiers and continuous time Bayesian network classifiers learned with log-likelihood.
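As a rough illustration of the scoring idea (a minimal sketch, not the paper's learning algorithm), the conditional log-likelihood of a class given observed data can be obtained from per-class joint log-likelihoods by a log-sum-exp normalization. Here `joint_ll` holds hypothetical values standing in for log P(class, trajectory) under each candidate class:

```python
import math

def conditional_log_likelihood(joint_ll, true_class):
    """CLL = log P(true_class, x) - log sum_c P(c, x).

    `joint_ll` maps each class label to the joint log-likelihood
    log P(c, x) of the observed data x under that class. The
    normalizer is computed with the log-sum-exp trick for stability.
    """
    m = max(joint_ll.values())
    log_norm = m + math.log(sum(math.exp(v - m) for v in joint_ll.values()))
    return joint_ll[true_class] - log_norm

# Toy joint log-likelihoods for a two-class problem (made-up numbers).
joint_ll = {"A": -10.2, "B": -12.7}
cll = conditional_log_likelihood(joint_ll, "A")
```

A discriminative score of this form rewards models that separate the true class from the alternatives, rather than models that merely fit the joint distribution well.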
Similar Resources
CTBNCToolkit: Continuous Time Bayesian Network Classifier Toolkit
Continuous time Bayesian network classifiers are designed for temporal classification of multivariate streaming data when time duration of events matters and the class does not change over time. This paper introduces the CTBNCToolkit: an open source Java toolkit which provides a stand-alone application for temporal classification and a library for continuous time Bayesian network classifiers. C...
Bayesian Conditional Gaussian Network Classifiers with Applications to Mass Spectra Classification
Classifiers based on probabilistic graphical models are very effective. In continuous domains, maximum likelihood is usually used to assess the predictions of those classifiers. When data is scarce, this can easily lead to overfitting. In any probabilistic setting, Bayesian averaging (BA) provides theoretically optimal predictions and is known to be robust to overfitting. In this work we introd...
Discriminative Learning of Bayesian Networks via Factorized Conditional Log-Likelihood
We propose an efficient and parameter-free scoring criterion, the factorized conditional log-likelihood (f̂CLL), for learning Bayesian network classifiers. The proposed score is an approximation of the conditional log-likelihood criterion. The approximation is devised in order to guarantee decomposability over the network structure, as well as efficient estimation of the optimal parameters, achi...
Efficient Approximation of the Conditional Relative Entropy with Applications to Discriminative Learning of Bayesian Network Classifiers
We propose a minimum variance unbiased approximation to the conditional relative entropy of the distribution induced by the observed frequency estimates, for multi-classification tasks. Such approximation is an extension of a decomposable scoring criterion, named approximate conditional log-likelihood (aCLL), primarily used for discriminative learning of augmented Bayesian network classifiers. ...
Bayesian network classifiers which perform well with continuous attributes: Flexible classifiers
When modelling a probability distribution with a Bayesian network, we are faced with the problem of how to handle continuous variables. Most previous works have solved the problem by discretizing them with the consequent loss of information. Another common alternative assumes that the data are generated by a Gaussian distribution (parametric approach), such as conditional Gaussian networks, wit...